Sparse N-way partial least squares by L1-penalization
Authors
Abstract
Similar Resources
Sparse Orthonormalized Partial Least Squares
Orthonormalized partial least squares (OPLS) is often used to find a low-rank mapping between inputs X and outputs Y by estimating loading matrices A and B. In this paper, we introduce sparse orthonormalized PLS as an extension of conventional PLS that finds sparse estimates of A through the use of the elastic net algorithm. We apply sparse OPLS to the reconstruction of presented images from BO...
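As a rough illustration of the sparsification idea described in this abstract (not the authors' sparse OPLS algorithm), the sketch below fits an ordinary PLS model with scikit-learn and then re-estimates each component's weight vector with an elastic net; the synthetic data, penalty settings, and two-step refit are assumptions made for the example.

```python
# Sketch: sparsify PLS weight vectors by re-fitting each component's scores
# against X with an elastic net (illustrative only, not the paper's method).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))          # inputs (samples x features)
Y = X[:, :3] @ rng.standard_normal((3, 2))  # outputs driven by a few features

pls = PLSRegression(n_components=2).fit(X, Y)
T = pls.transform(X)                        # latent scores, one column per component

# Re-estimate each component's weight vector with an elastic net so that
# many feature weights are driven exactly to zero.
A_sparse = np.column_stack([
    ElasticNet(alpha=0.1, l1_ratio=0.7, max_iter=10_000).fit(X, T[:, k]).coef_
    for k in range(T.shape[1])
])
print("non-zero weights per component:", (A_sparse != 0).sum(axis=0))
```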
Bayesian Sparse Partial Least Squares
Partial least squares (PLS) is a class of methods that makes use of a set of latent or unobserved variables to model the relation between (typically) two sets of input and output variables, respectively. Several flavors, depending on how the latent variables or components are computed, have been developed in recent years. In this letter, we propose a Bayesian formulation of PLS along with s...
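To make the latent-variable structure mentioned here concrete, the following is a plain (non-Bayesian) PLS fit with scikit-learn; the data-generating process and component count are arbitrary choices for the demonstration, not anything from the paper.

```python
# Plain PLS fit: X and Y are both projected onto a small set of shared components.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
latent = rng.standard_normal((200, 2))                 # shared latent variables
X = latent @ rng.standard_normal((2, 10)) + 0.1 * rng.standard_normal((200, 10))
Y = latent @ rng.standard_normal((2, 4)) + 0.1 * rng.standard_normal((200, 4))

pls = PLSRegression(n_components=2).fit(X, Y)
T, U = pls.transform(X, Y)   # X-scores and Y-scores (the latent components)
print("score correlation per component:",
      [np.corrcoef(T[:, k], U[:, k])[0, 1].round(3) for k in range(2)])
```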
Recursive N-Way Partial Least Squares for Brain-Computer Interface
This article considers tensor-input/tensor-output blockwise Recursive N-way Partial Least Squares (RNPLS) regression. It combines multi-way tensor decomposition with a consecutive calculation scheme, allowing blockwise treatment of tensor data arrays with huge dimensions as well as adaptive modeling of time-dependent processes with tensor variables. The numerical...
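The sketch below illustrates only the blockwise idea (it is not the RNPLS algorithm): each block of tensor inputs is unfolded to vectors and sufficient statistics are accumulated so the regression can be updated without storing all past data. The shapes, regularization, and data are assumptions for the example.

```python
# Blockwise accumulation sketch: unfold tensor samples, accumulate X^T X and
# X^T Y block by block, and re-solve a ridge-regularized regression each time.
import numpy as np

rng = np.random.default_rng(2)
n_blocks, block_size, I, J, q = 5, 40, 6, 4, 2   # tensor inputs of shape (I, J)
W_true = rng.standard_normal((I * J, q))

Sxx = np.zeros((I * J, I * J))   # running X^T X
Sxy = np.zeros((I * J, q))       # running X^T Y
for _ in range(n_blocks):
    X_tensor = rng.standard_normal((block_size, I, J))   # one block of tensor inputs
    X = X_tensor.reshape(block_size, -1)                 # unfold each sample to a vector
    Y = X @ W_true + 0.05 * rng.standard_normal((block_size, q))
    Sxx += X.T @ X
    Sxy += X.T @ Y
    # Updated estimate from the accumulated statistics after each block.
    W_hat = np.linalg.solve(Sxx + 1e-3 * np.eye(I * J), Sxy)

print("recovery error:", np.linalg.norm(W_hat - W_true) / np.linalg.norm(W_true))
```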
Sparse representation of cast shadows via l1-regularized least squares
Scenes with cast shadows can produce complex sets of images. These images cannot be well approximated by low-dimensional linear subspaces. However, in this paper we show that the set of images produced by a Lambertian scene with cast shadows can be efficiently represented by a sparse set of images generated by directional light sources. We first model an image with cast shadows as composed of a ...
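A minimal sketch of the L1-regularized least-squares step this abstract describes: an observed image is expressed as a sparse combination of basis images. The dictionary here is random and purely illustrative; the real method would use renderings under single directional lights.

```python
# Sketch: recover a sparse set of light-source coefficients with the lasso.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n_pixels, n_sources = 1024, 60
D = rng.random((n_pixels, n_sources))        # columns = flattened basis images
c_true = np.zeros(n_sources)
c_true[rng.choice(n_sources, size=5, replace=False)] = rng.random(5) + 0.5
y = D @ c_true + 0.01 * rng.standard_normal(n_pixels)   # observed image

lasso = Lasso(alpha=0.01, positive=True, max_iter=50_000).fit(D, y)
print("active light sources:", np.flatnonzero(lasso.coef_))
```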
Sparse Gaussian Process Regression via L1 Penalization
To handle massive data, a variety of sparse Gaussian Process (GP) methods have been proposed to reduce the computational cost. Many of them essentially map the large dataset into a small set of basis points. A common approach to learn these basis points is evidence maximization. Nevertheless, evidence maximization may lead to overfitting and cause a high computational cost. In this paper, we pr...
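As a hedged stand-in for the basis-point selection idea (not the paper's GP formulation), the sketch below L1-penalizes the weights of a kernel expansion over candidate basis points; the kernel, penalty, and candidate set are all assumptions for the example.

```python
# Sketch: select basis points by putting an L1 penalty on kernel-expansion weights.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(300)

Z = X[rng.choice(len(X), size=50, replace=False)]   # candidate basis points
K = rbf_kernel(X, Z, gamma=1.0)                     # kernel features w.r.t. candidates

lasso = Lasso(alpha=1e-3, max_iter=50_000).fit(K, y)
kept = np.flatnonzero(lasso.coef_)                  # candidates retained by the L1 penalty
print(f"{len(kept)} of {len(Z)} candidate basis points kept")
```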
Journal
Journal title: Chemometrics and Intelligent Laboratory Systems
Year: 2019
ISSN: 0169-7439
DOI: 10.1016/j.chemolab.2019.01.004